# Weakly Supervised Pretraining
| Model | Author | License | Task | Tags | Downloads | Likes | Description |
|---|---|---|---|---|---|---|---|
| Wspalign Mbert Base | qiyuw | | Machine Translation | Transformers, Multilingual | 200 | 0 | WSPAlign is a weakly supervised word alignment pretraining model based on span prediction; it supports word alignment between multiple languages (see the alignment sketch below the table). |
| Tapas Mini | google | Apache-2.0 | Large Language Model | Transformers, English | 15 | 0 | TAPAS is a BERT-like Transformer model designed for tabular data and related text, pretrained in a self-supervised manner on Wikipedia table data. |
| Tapas Base | google | Apache-2.0 | Large Language Model | Transformers, English | 2,457 | 7 | A BERT-based table understanding model pretrained on Wikipedia table data through self-supervised learning; supports table question answering and statement verification. |
| Tapas Base Finetuned Sqa | google | Apache-2.0 | Question Answering System | Transformers, English | 1,867 | 6 | A BERT-based table question answering model with intermediate pretraining for numerical reasoning, fine-tuned on the SQA dataset (see the table QA sketch below the table). |
| Tapas Medium | google | Apache-2.0 | Large Language Model | Transformers, English | 23 | 0 | A Transformer-based table question answering model pretrained in a self-supervised manner on English Wikipedia tables and associated text. |
| Tapas Tiny Finetuned Wtq | google | Apache-2.0 | Question Answering System | Transformers, English | 1,894 | 1 | A tiny Transformer model optimized for table question answering, built through intermediate pretraining followed by fine-tuning on a chain of datasets. |